This section shows the code used to obtain porosity information from the Multistripe Laser Triangulation (.asc) files. The code is meant to improve the reproducibility of the image processing once the data have been converted from .asc files to .tiff files.
An advantage of this code for rotation, versus using the rotate function in ImageJ, is that we have three different tiff files, and here we can guarantee the same rotation for all of them, so that they line up if they ever need to be worked with as a raster brick.
The advantage of cropping here is having control over, and a record of, how much was cropped from each side, keeping it consistent across all three tiff files.
The method consists of:
Using the Aggregating_asc function in 20220405_JP_DynamicporosityfromMLT_RscriptFunctions.R, which converts the point cloud in the .asc files into a matrix. Each .asc file takes between 20 and 40 min to run on my PC, depending on whether it is the 20 cm or the 40 cm long soil monolith.
Using the asctotiffs function in 20220405_JP_DynamicporosityfromMLT_RscriptFunctions.R, which produces three .tiff files from the matrix (DEM, macropore, RGB). The function also corrects for when the tray dries out and exposes the bottom of the tray, which leads to issues in the detection of pores.
Using the code in 20220405_JP_tiff_horizonProcessing_Rscript.R to find the angle of rotation needed to straighten the image. Traditional rotation via R packages or ImageJ runs into problems with the discrete nature of images; instead, we go back to the point cloud and rotate the point cloud data before aggregating it into the matrix that gives the images.
The EBImage package was used to write this code; if it is not supported in the future, it is saved under version control. Contact Julio Pachon (julio.pachon@ucr.edu) to obtain it.
Using EBImage package:
Install "EBImage" using the two lines below:
install.packages("BiocManager")
BiocManager::install("EBImage", force = TRUE)
Use the Rotating_ascFile function in 20220405_JP_DynamicporosityfromMLT_RscriptFunctions.R
Use code in 20220405_JP_tiff_horizonProcessing_Rscript.R to cut the images into the different horizons desired.
The ImageJ macro 20220419_Analyze-and-Measure_Macro.ijm creates the Analyze and Measure csv files that provide quantitative information on the pores in the image.
The Aggregating_asc function has the following inputs:
ascname: character; file name of the point cloud data in the .asc format outputted from RapidWorks; this file contains columns of data, the first three of which, refer to the x, y, z coordinates of the points; additional columns refer to the RGB values associated with these points and the normals associated with these points
resol: numeric; value that represents the resolution of the resulting matrix grid in mm
dname: character; folder where the asc file is stored if not in the default folder
oname: character; folder where the tiff file is to be outputted if not in the default folder
date: logical; TRUE means output the system date in the tiff filename (default); FALSE means not to output the date in the file name
NOTE: There are two large loops in this function whose progress are displayed as two progress bars
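With the inputs above, a call might look like the following sketch (the .asc file name and folder are taken from later in this document; argument names follow the input list above, and any remaining defaults are assumed):

```r
# Sketch only: source the functions file named in the text, then aggregate one
# .asc file at the 0.18 mm camera resolution used throughout this document.
source("20220405_JP_DynamicporosityfromMLT_RscriptFunctions.R")
Aggregating_asc(ascname = "mlt_KONZA_PPT_S7494_T3.asc",
                resol   = 0.18,            # grid resolution in mm
                dname   = "../Data/MLT/",  # folder holding the .asc file
                date    = TRUE)            # prefix the output name with the date
```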
library("tiff")
library("raster")
## Loading required package: sp
library("EBImage")
##
## Attaching package: 'EBImage'
## The following objects are masked from 'package:raster':
##
## flip, rotate
mltfile <- sub('.asc', '', ascname)
#
### Read in the asc file: ####
if (length(dname)==0) {
mlt <- read.table(ascname,header=FALSE,sep="\t")
if( length(names(mlt)) == 9){
names(mlt) <- c("x","y","z","R","G","B","a","b","c")}else{
names(mlt) <- c("x","y","z","R","G","B")
}
} else { # file is stored in another folder
currentwd <- getwd()
setwd(dname)
mlt <- read.table(ascname,header=FALSE,sep="\t")
if( length(names(mlt)) == 9){
names(mlt) <- c("x","y","z","R","G","B","a","b","c")}else{
names(mlt) <- c("x","y","z","R","G","B")
}
setwd(currentwd) # resets to the original working directory
}
This part of the code normalizes x, y, and z into positive values starting from 0 by subtracting the lowest value of each column, and then assigns them to x and y bins of set widths (the resolution of the camera). This portion takes about 20 min for a tray 20 cm by 30 cm and 40 min for a tray 40 cm by 30 cm.
### Define the origin based on the minimum data in each column: ####
nmlt <- mlt # save the "normalized" mlt values into a new dataframe
nmlt$x <- nmlt$x-min(mlt$x)
nmlt$y <- nmlt$y-min(mlt$y)
nmlt$z <- nmlt$z-min(mlt$z)
#
### Calculate the spatial extent of the data (in mm): ####
xdim <- max(nmlt$x)+resol # width
ydim <- max(nmlt$y)+resol # length
#
# Setup a grid based on the extent data:
xbound <- seq(0,xdim,by=resol) # provides the x coordinate for the left
# grid cell boundary
ybound <- seq(0,ydim,by=resol) # provides the y coordinate for the upper
# grid cell boundary
# Assign the column values of mltmat to a new dataframe, xmlt:
xmlt <- data.frame(nmlt,row=NA,col=NA) # switch of variables
xmlt <- xmlt[order(xmlt$x),] # sort the dataframe on the x values
row.names(xmlt) <- NULL
x <- round(xmlt$x,round(-log10(resol/10))) # rounds the x data to a tenth
# of the specified resolution; if the resolution is greater than 10 this
# will cap the rounding to a single mm
dummy_mat <- matrix(0,nrow=length(ybound),ncol=length(xbound)) # matrix to
# store the answer to the question: Is there a point in this grid cell? A:
# Yes (1) or No (0); Note: columns = x, rows = y
pb <- txtProgressBar(min = 0, max = (ncol(dummy_mat)-1), style = 3) # create
# progress bar
for (i in 1:(ncol(dummy_mat)-1)) {
xmlt$col[which((x > xbound[i]) & (x <= xbound[i+1]))] <- i
Sys.sleep(0.1)
setTxtProgressBar(pb, i) # update progress bar
}
close(pb)
#
# Assign the row values of mltmat to xmlt:
xmlt <- xmlt[order(xmlt$y),] # sort the dataframe on the y values
row.names(xmlt) <- NULL
y <- round(xmlt$y,round(-log10(resol/10)))
pb <- txtProgressBar(min = 0, max = (nrow(dummy_mat)-1), style = 3) # create
# progress bar
for (i in 1:(nrow(dummy_mat)-1)) {
xmlt$row[which((y > ybound[i]) & (y <= ybound[i+1]))] <- i
Sys.sleep(0.1)
setTxtProgressBar(pb, i) # update progress bar
}
close(pb)
#
# Create a new rowcol dataframe and order by rows then cols:
rowcol <- xmlt[,c("row","col")]
rowcol <- rowcol[order(rowcol$row, rowcol$col),]
row.names(rowcol) <- NULL
rowcol <- unique(rowcol) #rowcol[!duplicated(rowcol),] # removes any duplicated cells
row.names(rowcol) <- NULL
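The column-assignment loop above can be cross-checked against base R's vectorized cut(), which uses the same (lo, hi] interval convention; a toy example with hypothetical coordinates:

```r
# Toy cross-check of the binning loop (hypothetical x values in mm):
resol <- 0.18
x <- c(0.05, 0.17, 0.18, 0.19, 0.40)
xbound <- seq(0, max(x) + resol, by = resol)  # left grid-cell boundaries
col_loop <- rep(NA_integer_, length(x))
for (i in 1:(length(xbound) - 1)) {           # same assignment as in the function
  col_loop[which((x > xbound[i]) & (x <= xbound[i + 1]))] <- i
}
col_cut <- as.integer(cut(x, breaks = xbound))  # vectorized equivalent
stopifnot(identical(col_loop, col_cut))         # both give bins 1, 1, 1, 2, 3
```

cut() could replace the two progress-bar loops if speed ever becomes an issue; the loops are kept here to match the original function.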
This portion of the code takes all the values assigned to the same x and y bin in the previous section, aggregates them by the mean, and rounds the answer to the nearest integer.
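A toy illustration of this aggregation with hypothetical values:

```r
# Two points fall in cell (row 1, col 3); one point falls in cell (row 2, col 4).
df <- data.frame(row = c(1, 1, 2), col = c(3, 3, 4),
                 z = c(1.0, 2.0, 5.0), R = c(10, 20, 100))
df$matid <- as.factor(paste(df$row, df$col, sep = "_"))
agg <- aggregate(cbind(z, R) ~ matid, df, FUN = "mean")
agg$R <- round(agg$R, 0)   # color channels are rounded to the nearest integer
stopifnot(agg$z[agg$matid == "1_3"] == 1.5,   # mean of 1.0 and 2.0
          agg$R[agg$matid == "1_3"] == 15,    # mean of 10 and 20
          agg$z[agg$matid == "2_4"] == 5)     # a single point is unchanged
```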
#Aggregating data ####
# Calculates the mean height (z) and R, G, B for each cell:
xmlt <- xmlt[order(xmlt$row, xmlt$col),]
row.names(xmlt) <- NULL
xmlt <- data.frame(xmlt, # used to aggregate the data below
matid = as.factor(paste(xmlt$row,xmlt$col,sep="_")))
xmlta <- aggregate(cbind(z, R, G, B) ~ matid, xmlt, FUN = "mean")
xmlta$R <- round(xmlta$R,0) # rounds to the nearest integer
xmlta$G <- round(xmlta$G,0) # rounds to the nearest integer
xmlta$B <- round(xmlta$B,0) # rounds to the nearest integer
xmltm <- merge(xmlt[,c("row","col","matid")],xmlta,by="matid") # holds
### the row and col indices and the mean values for z, R, G, and B ####
xmltm <- xmltm[,c("row","col","z","R","G","B")] # removes the matid col
xmltm <- xmltm[-which(is.na(xmltm$col) | is.na(xmltm$row)),] # removes the rows or columns with NA values
### removes any duplicated cells####
xmltm <- xmltm[!duplicated(xmltm),]
xmltm <- xmltm[order(xmltm$row, xmltm$col),] # order the data.frame
row.names(xmltm) <- NULL
head(xmltm)
## row col z R G B
## 1 1 366 39.31686 88 79 76
## 2 1 386 38.30772 135 115 121
## 3 1 388 38.33679 137 116 122
## 4 1 389 38.20964 134 113 119
## 5 1 390 38.29087 152 131 136
## 6 1 391 38.24664 140 121 124
head(rowcol)
## row col
## 1 1 366
## 2 1 386
## 3 1 388
## 4 1 389
## 5 1 390
## 6 1 391
head(xbound)
## [1] 0.00 0.18 0.36 0.54 0.72 0.90
head(ybound)
## [1] 0.00 0.18 0.36 0.54 0.72 0.90
mltfile
## [1] "../Data/MLT/mlt_KONZA_PPT_S7494_T3"
Given how long this portion can take, the values needed by the second function (xmltm, ybound, xbound, rowcol, mltfile) are saved under "../Output Files/202202_updated code/Tray_ascprocessing/" as an .Rdata file. By separating this portion from the tiff renderings, problems arising from the DEM correction, or questions about the aggregation, can be addressed without re-running the previous code.
if (length(oname)==0) {
if (date == FALSE) {
save(xmltm, ybound, xbound , rowcol,mltfile,
file= paste(mltfile,'_',
resol,
'mmResolution',
'_aggregatedASCData.Rdata',sep=''))
} else { # include the date in the file name
save(xmltm, ybound, xbound , rowcol,mltfile,
file= paste(gsub("-", "", as.character(Sys.Date())),
'_',mltfile,'_',resol,'mmResolution',
'_aggregatedASCData.Rdata',sep=''))
}
} else { # store the tiff in a different folder
currentwd <- getwd()
setwd(oname)
if (date == FALSE) {
save(xmltm, ybound, xbound , rowcol,mltfile,
file=paste(mltfile,'_',resol,'mmResolution',
'_aggregatedASCData.Rdata',sep=''))
} else { # include the date in the file name
save(xmltm, ybound, xbound , rowcol,mltfile,
file=paste(gsub("-", "", as.character(Sys.Date())),
'_',mltfile,'_',resol,'mmResolution',
'_aggregatedASCData.Rdata',sep=''))
}
setwd(currentwd) # resets to the original working directory
}
Multiple .asc files may be processed at the same time using the following code. Note: do not use more than half of your cores, or your computer will crash.
library(tiff)
library(matlab)
library(parallel)
param_files <- list.files("../Data/MLT/", pattern = "*.asc")
ncores <- detectCores(logical = FALSE)
cl <- makeCluster(ncores/2)
ascname <- paste0("../Data/MLT/",param_files)
start <- Sys.time()
start
clusterApply(cl, ascname , Aggregating_asc)
stopCluster(cl)
(end <- Sys.time())
end-start
The asctotiffs function has the following inputs:
rdata_file: the .rdata output file from the Aggregating_asc function
resol: numeric; value that represents the resolution of the resulting matrix grid in mm
tray_depth_cm: depth of the tray which is used in the correction of z values (more details below).
depth_from_tray_bottom_mm: The threshold to correct z values for the macropore tiff. The default is 5 mm but 10 mm has also been used. If looking at images for the first time, set at 0.
dname: character; folder where the asc file is stored if not in the default folder
oname: character; folder where the tiff file is to be outputted if not in the default folder
date: logical; TRUE means output the system date in the tiff filename (default); FALSE means not to output the date in the file name
###RGB tiff:
rdata_file= "../Output Files/202202_updated code/Tray_ascprocessing/Unrotated/mlt_KONZA_PPT_S7494_T3_0.18mmResolution_aggregatedASCData.Rdata"
load(rdata_file)
resol=0.18
tray_depth_cm=2
depth_from_tray_bottom_mm=5
library(matlab)
##
## Attaching package: 'matlab'
## The following object is masked from 'package:stats':
##
## reshape
## The following objects are masked from 'package:utils':
##
## find, fix
## The following object is masked from 'package:base':
##
## sum
library(compiler)
## This rotate portion is a vestige of the original code and may be removed in future iterations
# Function for rotating a matrix #####
rotate <- function(x,direction="clockwise") {
# Rotates a matrix either clockwise or counter clockwise as specified
# by direction
# x = matrix
# direction = either 'clockwise' or 'counter' (any value other than
# 'clockwise' will rotate counter clockwise)
if (direction == "clockwise") {
t(apply(x, 2, rev))
} else {
apply(t(x), 2, rev)
}
}
cmp_rotate <- cmpfun(rotate)
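A self-contained sanity check of the rotation helper on a 2 x 2 matrix:

```r
library(compiler)
rotate <- function(x, direction = "clockwise") {  # same helper as above
  if (direction == "clockwise") t(apply(x, 2, rev)) else apply(t(x), 2, rev)
}
cmp_rotate <- cmpfun(rotate)
m <- matrix(1:4, nrow = 2, byrow = TRUE)  # 1 2 / 3 4
cw  <- cmp_rotate(m, "clockwise")         # 3 1 / 4 2
ccw <- cmp_rotate(m, "counter")           # 2 4 / 1 3
stopifnot(identical(cw,  matrix(c(3L, 1L, 4L, 2L), nrow = 2, byrow = TRUE)),
          identical(ccw, matrix(c(2L, 4L, 1L, 3L), nrow = 2, byrow = TRUE)))
```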
# Place the mean values into the correct matrices: ####
##RGB matrix####
rgbmat <- array(NA,c(length(ybound),length(xbound),3)) # array to
# store the answer to the question: What is the color of this grid cell?
# Note: columns = x, rows = y
rgbmat <- rgbmat[1:(nrow(rgbmat[,,1])-1),1:(ncol(rgbmat[,,1])-1),] # remove the
# last row and col; these represent the right or bottom extent of the last full cells
rgbmat[,,1][xmltm$row + nrow(rgbmat) * (xmltm$col - 1)] <- xmltm$R # uses
# the single index notation for matrices
rgbmat[,,2][xmltm$row + nrow(rgbmat) * (xmltm$col - 1)] <- xmltm$G # uses
# the single index notation for matrices
rgbmat[,,3][xmltm$row + nrow(rgbmat) * (xmltm$col - 1)] <- xmltm$B # uses
# the single index notation for matrices
dimhold <- dim(cmp_rotate(t(rgbmat[,,1]), direction = 'counter'))
colormat <- array(NA,c(dimhold[1],dimhold[2],3)) # creates a new array to
# hold the correctly oriented rgb values
colormat[,,1] <- cmp_rotate(t(rgbmat[,,1]), direction = 'counter')/255
colormat[,,2] <- cmp_rotate(t(rgbmat[,,2]), direction = 'counter')/255
colormat[,,3] <- cmp_rotate(t(rgbmat[,,3]), direction = 'counter')/255
Output of the mlt_KONZA_PPT_S7494_T3 RGB code section
### Save the resulting matrices as tiff files: ####
if (length(oname)==0) {
if (date == FALSE) {
writeTIFF(colormat,paste(mltfile,'_',resol,'mmResolution',
'_RGB.tiff',sep=''))
} else { # include the date in the file name
writeTIFF(colormat,paste(gsub("-", "", as.character(Sys.Date())),
'_',mltfile,'_',resol,'mmResolution',
'_RGB.tiff',sep='')) }
} else { # store the tiff in a different folder
currentwd <- getwd()
setwd(oname)
if (date == FALSE) {
writeTIFF(colormat,paste(mltfile,'_',resol,'mmResolution',
'_RGB.tiff',sep=''))
} else { # include the date in the file name
writeTIFF(colormat,paste(gsub("-", "", as.character(Sys.Date())),
'_',mltfile,'_',resol,'mmResolution',
'_RGB.tiff',sep=''))
}
setwd(currentwd) # resets to the original working directory
}
###DEM correction
demmat <- matrix(NA,nrow=length(ybound),ncol=length(xbound)) # matrix to
# store the answer to the question: What is the z of this grid cell?
# Note: columns = x, rows = y
demmat <- demmat[1:(nrow(demmat)-1),
1:(ncol(demmat)-1)] # remove the last row and col
# in demmat; Note: these are removed because they represent the right
# or bottom extent of the last full cells
demmat[xmltm$row + nrow(demmat) * (xmltm$col - 1)] <- xmltm$z # uses
# the single index notation for matrices
# Fix the orientation of the following matrices to match mltmat:
demmat <- cmp_rotate(t(demmat), direction = 'counter')
dimhold <- dim(cmp_rotate(t(rgbmat[,,1]), direction = 'counter'))
colormat <- array(NA,c(dimhold[1],dimhold[2],3)) # creates a new array to
# hold the correctly oriented rgb values
colormat[,,1] <- cmp_rotate(t(rgbmat[,,1]), direction = 'counter')/255
colormat[,,2] <- cmp_rotate(t(rgbmat[,,2]), direction = 'counter')/255
colormat[,,3] <- cmp_rotate(t(rgbmat[,,3]), direction = 'counter')/255
#
### Assigns color values to the DEM matrix to output it as a tiff: ####
zRGB <- col2rgb(jet.colors(100)[as.numeric(cut(xmltm$z,breaks=100))])
zcol <- array(NA,c(dim(colormat)[1],dim(colormat)[2],dim(colormat)[3]))
zcol[,,1][xmltm$row + nrow(zcol) * (xmltm$col - 1)] <- zRGB[1,] # uses
# the single index notation for matrices (R)
zcol[,,2][xmltm$row + nrow(zcol) * (xmltm$col - 1)] <- zRGB[2,] # uses
# the single index notation for matrices (G)
zcol[,,3][xmltm$row + nrow(zcol) * (xmltm$col - 1)] <- zRGB[3,] # uses
# the single index notation for matrices (B)
dimhold <- dim(cmp_rotate(t(zcol[,,1]), direction = 'counter'))
demcol <- array(NA,c(dimhold[1],dimhold[2],3)) # creates a new array to
# hold the correctly oriented rgb values
demcol[,,1] <- cmp_rotate(t(zcol[,,1]), direction = 'counter')/255
demcol[,,2] <- cmp_rotate(t(zcol[,,2]), direction = 'counter')/255
demcol[,,3] <- cmp_rotate(t(zcol[,,3]), direction = 'counter')/255
#
###view original DEM####
image(as.matrix(demmat), useRaster=TRUE, axes=FALSE, col = hcl.colors(500, "spectral"))
The code below corrects two things:
DEM gradient
The red and orange points are reflections of the bottom of the tray that should be white.
#Finding the values of the rulers using the corners
rightMost <- dim(demmat)[2]
bottomMost <- dim(demmat)[1]
Top_left <- max(demmat[1:300,1:300], na.rm = TRUE)
Top_right <- max(demmat[1:300,(rightMost-300):rightMost], na.rm = TRUE)
Bottom_left <- max(demmat[(bottomMost-300):bottomMost,1:300], na.rm = TRUE)
Bottom_right <- max(demmat[(bottomMost-300):bottomMost,(rightMost-300):rightMost], na.rm = TRUE)
Reg_table <- data.frame()
Reg_table[1,1:3] <- xmltm[xmltm$z==Top_left,1:3]
Reg_table[2,1:3] <- xmltm[xmltm$z==Top_right,1:3]
Reg_table[3,1:3] <- xmltm[xmltm$z==Bottom_left,1:3]
Reg_table[4,1:3] <- xmltm[xmltm$z==Bottom_right,1:3]
#finding the gradient and correcting for it
reg_eq <- lm(z~row+col,Reg_table)
Modified_Z <- xmltm$z-predict(reg_eq,xmltm)+reg_eq[[1]][1]
# Creating the modified DEM
mod_demmat <- matrix(NA,nrow=length(ybound),ncol=length(xbound)) # matrix to
# store the answer to the question: What is the z of this grid cell?
# Note: columns = x, rows = y
mod_demmat <- mod_demmat[1:(nrow(mod_demmat)-1),
1:(ncol(mod_demmat)-1)] # remove the last row and col
# in mod_demmat; Note: these are removed because they represent the right
# or bottom extent of the last full cells
mod_demmat[xmltm$row + nrow(mod_demmat) * (xmltm$col - 1)] <- Modified_Z
mod_demmat <- cmp_rotate(t(mod_demmat), direction = 'counter')
#comparing images
par(mfrow = c(1,2))
image(as.matrix(demmat), useRaster=TRUE, axes=FALSE, col = hcl.colors(500, "spectral"))
image(as.matrix(mod_demmat), useRaster=TRUE, axes=FALSE, col = hcl.colors(500, "spectral"))
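The save block below writes mod_demcol, the colorized version of the corrected DEM, but its construction does not appear in this excerpt. A plausible reconstruction, mirroring how zcol and demcol were built from xmltm$z but using Modified_Z instead (a sketch, not the author's verbatim code; it assumes the earlier objects xmltm, colormat, cmp_rotate, and the matlab package's jet.colors are in scope):

```r
# Reconstruction sketch: colorize the corrected elevations (Modified_Z) the
# same way zcol/demcol were built from xmltm$z above.
mod_zRGB <- col2rgb(jet.colors(100)[as.numeric(cut(Modified_Z, breaks = 100))])
mod_zcol <- array(NA, dim(colormat))
mod_zcol[,,1][xmltm$row + nrow(mod_zcol) * (xmltm$col - 1)] <- mod_zRGB[1, ]
mod_zcol[,,2][xmltm$row + nrow(mod_zcol) * (xmltm$col - 1)] <- mod_zRGB[2, ]
mod_zcol[,,3][xmltm$row + nrow(mod_zcol) * (xmltm$col - 1)] <- mod_zRGB[3, ]
dimhold <- dim(cmp_rotate(t(mod_zcol[,,1]), direction = 'counter'))
mod_demcol <- array(NA, c(dimhold[1], dimhold[2], 3))
mod_demcol[,,1] <- cmp_rotate(t(mod_zcol[,,1]), direction = 'counter')/255
mod_demcol[,,2] <- cmp_rotate(t(mod_zcol[,,2]), direction = 'counter')/255
mod_demcol[,,3] <- cmp_rotate(t(mod_zcol[,,3]), direction = 'counter')/255
```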
### Save the resulting Corrected DEM as tiff files: ####
if (length(oname)==0) {
if (date == FALSE) {
writeTIFF(mod_demcol,paste(mltfile,'_',
resol,
'mmResolution',
'_DEM.tiff',sep=''))
} else { # include the date in the file name
writeTIFF(mod_demcol,paste(gsub("-", "", as.character(Sys.Date())),
'_',mltfile,'_',resol,'mmResolution',
'_DEM.tiff',sep=''))
}
} else { # store the tiff in a different folder
currentwd <- getwd()
setwd(oname)
if (date == FALSE) {
writeTIFF(mod_demcol,paste(mltfile,'_',resol,'mmResolution',
'_DEM.tiff',sep=''))
} else { # include the date in the file name
writeTIFF(mod_demcol,paste(gsub("-", "", as.character(Sys.Date())),
'_',mltfile,'_',resol,'mmResolution',
'_DEM.tiff',sep=''))
}
setwd(currentwd) # resets to the original working directory
}
Output of the mlt_KONZA_PPT_S7494_T3 DEM code section
The code below assumes that the tops of the rulers sit 2 cm above the bottom of the tray and uses that reference to remove values below the depth_from_tray_bottom_mm threshold.
# matrix to store the answer to the question: Is there a point in this grid cell? A:
# Yes (1) or No (0); Note: columns = x, rows = y
mltmat <- matrix(0,nrow=length(ybound),ncol=length(xbound))
# Assign zero values in mltmat for the appropriate missing indices in
# rowcol:
mltmat <- mltmat[1:(nrow(mltmat)-1),
1:(ncol(mltmat)-1)] # remove the last row and col
# in mltmat_corrected; Note: these are removed because they represent the right
# or bottom extent of the last full cells
# uses the single index notation for matrices
#This is the uncorrected macropore matrix in which each pixel with a value gets a "1" and all others are "0"
mltmat[rowcol$row + nrow(mltmat) * (rowcol$col - 1)] <- 1
## Georeferencing the rulers
rightMost <- dim(mod_demmat)[2]
bottomMost <- dim(mod_demmat)[1]
Top_left <- max(mod_demmat[1:300,1:300], na.rm = TRUE)
Top_right <- max(mod_demmat[1:300,(rightMost-300):rightMost], na.rm = TRUE)
Bottom_left <- max(mod_demmat[(bottomMost-300):bottomMost,1:300], na.rm = TRUE)
Bottom_right <- max(mod_demmat[(bottomMost-300):bottomMost,(rightMost-300):rightMost], na.rm = TRUE)
#This value is the z assumed to be 2 cm
Average_depth_rulers <- mean(c(Top_left,Top_right, Bottom_left, Bottom_right))
#z value meant to be depth_from_tray_bottom_mm
DEM_threshold <-depth_from_tray_bottom_mm*Average_depth_rulers/(tray_depth_cm*10)
# correction of the macropore matrix
Correction_sites <- which(mod_demmat<DEM_threshold)
mltmat <- cmp_rotate(t(mltmat), direction = 'counter')
mltmat_corrected <- mltmat
mltmat_corrected[Correction_sites] <- 0
image(as.matrix(mod_demmat), useRaster=TRUE, axes=FALSE, col = hcl.colors(500, "spectral"))
image(as.matrix(mltmat), useRaster=TRUE, axes=FALSE, col = hcl.colors(500, "grays"))
image(as.matrix(mltmat_corrected), useRaster=TRUE, axes=FALSE, col = hcl.colors(500, "grays"))
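As a worked example of the threshold scaling (the ruler height here is made up; in practice it comes from the corner maxima above):

```r
tray_depth_cm <- 2                 # ruler tops assumed 2 cm above the tray bottom
depth_from_tray_bottom_mm <- 5     # cutoff used for the macropore correction
Average_depth_rulers <- 18         # hypothetical mean ruler-top z in DEM units
DEM_threshold <- depth_from_tray_bottom_mm * Average_depth_rulers / (tray_depth_cm * 10)
stopifnot(DEM_threshold == 4.5)    # 5 mm is a quarter of the 20 mm tray depth
```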
### Saving Macropore tiff
if (length(oname)==0) {
if (date == FALSE) {
writeTIFF(mltmat_corrected,paste(mltfile,'_',
resol,
'mmResolution',
'_Macropores.tiff',sep=''))
} else { # include the date in the file name
writeTIFF(mltmat_corrected,paste(gsub("-", "", as.character(Sys.Date())),
'_',mltfile,'_',resol,'mmResolution',
'_Macropores.tiff',sep=''))
}
} else { # store the tiff in a different folder
currentwd <- getwd()
setwd(oname)
if (date == FALSE) {
writeTIFF(mltmat_corrected,paste(mltfile,'_',resol,'mmResolution',
'_Macropores.tiff',sep=''))
} else { # include the date in the file name
writeTIFF(mltmat_corrected,paste(gsub("-", "", as.character(Sys.Date())),
'_',mltfile,'_',resol,'mmResolution',
'_Macropores.tiff',sep=''))
}
setwd(currentwd) # resets to the original working directory
}
Given how quick this function is, looping through the files in a folder runs quickly without the need for parallelization.
rdata_files <- list.files("../Output Files/202202_updated code/Tray_ascprocessing/", pattern = "*.Rdata")
for (i in 1:length(rdata_files)){
rdata_file <- paste0("../Output Files/202202_updated code/Tray_ascprocessing/",rdata_files[c(i)])
asctotiffs(rdata_file, depth_from_tray_bottom_mm = 5)
}
EBImage allows manipulation of the tiff in R: zoom in (by scrolling with the mouse or using the top-right buttons) and find the top and left boundaries of the tray by hovering the mouse over them and reading the coordinates shown in the lower portion of the viewer.
display(RBG)
display(DEM)
display(Macropores)
Screenshot of the EBImage viewer
If the image orientation is not correct (check that the ruler shows T at the top, L on the left, B on the bottom, and R on the right), rotate to correct it.
Orientation_change= -90
RBG = rotate(RBG, Orientation_change, bg.col = "black")
DEM = rotate(DEM, Orientation_change, bg.col = "black")
Macropores = rotate(Macropores, Orientation_change, bg.col = "black")
Further correction can be done with basic trigonometry. Find a feature along the picture that should be straight and use its position to solve for the angle of distortion. In this example, two points on the left ruler are used: (121, 168) and (144, 1233).
(angle <- atan((144-121)/(1233-168))*180/pi)
RBG_rotate = rotate(RBG, angle, bg.col = "black")
DEM_rotate = rotate(DEM, angle, bg.col = "black")
Macropores_rotate = rotate(Macropores, angle, bg.col = "black")
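For reference, the two ruler points above give a tilt of roughly 1.24 degrees:

```r
# Rise of 23 pixels over a run of 1065 pixels along the left ruler:
angle <- atan((144 - 121) / (1233 - 168)) * 180 / pi
stopifnot(abs(angle - 1.2372) < 1e-3)
```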
Rotated RBG image
There are issues with rotating the images by small angles. Take the macropore tiff, for example: it is a binary file of 0s and 1s, yet rotating it by a small angle changes the number of 0s and 1s. Discussions with Daniel Hirmas led to rotating the point cloud in the .asc files instead.
The Rotating_ascFile function is an update to the Aggregating_asc function that adds the ability to rotate the values in the point cloud, meaning that Aggregating_asc can be deprecated.
The added code simply does:
###Change to Polar Coordinates ####
nmlt$r <- sqrt((nmlt$y)^2+(nmlt$x)^2)
nmlt$theta <- atan((nmlt$y)/(nmlt$x))
### Perform polar rotation ####
angle_rad <- angle*(pi/180)
nmlt$theta_rotated <- (nmlt$theta-angle_rad)
### Change back to cartesian ####
nmlt$rotated_y <- nmlt$r*sin(nmlt$theta_rotated)
nmlt$rotated_x <- nmlt$r*cos(nmlt$theta_rotated)
nmlt$rotated_x <- nmlt$rotated_x-min(nmlt$rotated_x)
nmlt$rotated_y <- nmlt$rotated_y-min(nmlt$rotated_y)
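A quick sanity check of the polar-rotation math: because the angle is subtracted, a 45 degree rotation sends the point (1, 1) onto the positive x-axis.

```r
x <- 1; y <- 1
r <- sqrt(x^2 + y^2)
theta <- atan(y / x)        # safe here: the normalized coordinates are non-negative
angle_rad <- 45 * (pi / 180)
theta_rotated <- theta - angle_rad
stopifnot(abs(r * sin(theta_rotated)) < 1e-12,            # rotated y is ~0
          abs(r * cos(theta_rotated) - sqrt(2)) < 1e-12)  # rotated x is sqrt(2)
```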
This code runs right after the x, y, z values are made positive and before the matrix is created, as explained in section 1.
After running the Rotating_ascFile function, run asctotiffs again.
At this point, the code in 20220405_JP_tiff_horizonProcessing_Rscript.R is run line by line because of the subjective decisions involved; however, metadata on these decisions is recorded so the procedure can be replicated.
After reading in the images, use the EBImage viewer to find the column and row of the top and left edges of the tray, and choose the buffer to cut from the edges to minimize problems due to edge effects.
#Basic Info- check by using the RBG file
Length_tray_mm <- 200
Width_tray_mm <-300
Resolution_mm <- 0.18 #resolution of the camera, which is also the size used in the aggregation when transforming from .asc to .tiff files.
Top_tray <- 160 #row
Left_tray <- 147 #col
Bot_crop_mm <- 20
Left_crop_mm <- 20
Top_crop_mm <- 20
Right_crop_mm <- 10
length_tray_pixel <- Length_tray_mm/Resolution_mm
Width_tray_pixel <- Width_tray_mm/Resolution_mm
Top_crop_pixels <- Top_crop_mm/Resolution_mm
Bot_crop_pixels <- Bot_crop_mm/Resolution_mm
Left_crop_pixels <- Left_crop_mm/Resolution_mm
Righ_crop_pixels <- Right_crop_mm/Resolution_mm
RBG_crop <- RBG_rotate[
(Left_tray+Left_crop_pixels):(Left_tray+Width_tray_pixel-Righ_crop_pixels),
(Top_tray+Top_crop_pixels):(Top_tray+length_tray_pixel-Bot_crop_pixels),]
Macropore_crop <- Macropores_rotate[
(Left_tray+Left_crop_pixels):(Left_tray+Width_tray_pixel-Righ_crop_pixels),
(Top_tray+Top_crop_pixels):(Top_tray+length_tray_pixel-Bot_crop_pixels)]
DEM_crop<- DEM_rotate[
(Left_tray+Left_crop_pixels):(Left_tray+Width_tray_pixel-Righ_crop_pixels),
(Top_tray+Top_crop_pixels):(Top_tray+length_tray_pixel-Bot_crop_pixels),]
View the resulting cropped trays and modify the Bot_crop_mm, Left_crop_mm, Top_crop_mm, and Right_crop_mm as needed.
display(RBG_crop)
display(Macropore_crop)
display(DEM_crop)
Screenshot of the cropped tiffs
Had this been a function or a loop, a metadata dataframe would be populated and exported. At this point, the data are placed into a dataframe and exported to the clipboard so they can be pasted into an Excel or Google sheet, where the depth_from_tray_bottom_mm value is also recorded.
File can be found here:
../Output Files/Metadata/Reference Whole Trays.gslides
Metadata_wholetrays.jpg
Using the information collected in the field about the depths of the different soil horizons, you can crop the tiffs to the appropriate sizes and record how much is cut off each side. The code chooses the correct cutoff once Bot_crop_mm, Left_crop_mm, Top_crop_mm, Right_crop_mm, tray_top_depth, Tray_bottom_depth, Horizon_depth_top, and Horizon_depth_bottom are set.
tray_top_depth <- 74
Tray_bottom_depth <- 94
Horizon_depth_top <- 68
Horizon_depth_bottom <- 103
if(Horizon_depth_top>tray_top_depth+ (Top_crop_pixels*Resolution_mm/10)){
Top_cut_pixels <- Top_tray+((Horizon_depth_top-tray_top_depth)*10/Resolution_mm)
} else {
Top_cut_pixels <- Top_tray+Top_crop_pixels
}
if(Tray_bottom_depth>Horizon_depth_bottom- (Bot_crop_pixels*Resolution_mm/10)){
Bottom_cut_pixels <- Top_tray+length_tray_pixel-((Tray_bottom_depth-Horizon_depth_bottom)*10/Resolution_mm)
} else {
Bottom_cut_pixels <- Top_tray+length_tray_pixel-Bot_crop_pixels
}
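Plugging the document's values into the cutoff logic above shows the fallback branch being taken, since the horizon top (68 cm) sits above the tray top (74 cm):

```r
Resolution_mm <- 0.18
Top_tray <- 160                        # row of the tray top in the image
Top_crop_pixels <- 20 / Resolution_mm  # 20 mm buffer, about 111.1 pixels
tray_top_depth <- 74; Horizon_depth_top <- 68   # depths in cm
if (Horizon_depth_top > tray_top_depth + (Top_crop_pixels * Resolution_mm / 10)) {
  Top_cut_pixels <- Top_tray + ((Horizon_depth_top - tray_top_depth) * 10 / Resolution_mm)
} else {
  Top_cut_pixels <- Top_tray + Top_crop_pixels  # horizon starts above the tray top
}
stopifnot(abs(Top_cut_pixels - (160 + 20 / 0.18)) < 1e-9)
```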
The following code displays the rulers and the horizon together, and is a good point to double-check that the horizon is cut correctly:
top_ruler <- 1:(Top_tray-25)
bottom_ruler <- (Top_tray+length_tray_pixel+25):dim(RBG_rotate)[2]
Horizon_RBG_test <- RBG_rotate[c(1:(Left_tray-25),
rep(1,20), #creates some space between ruler and the horizon
(Left_tray+Left_crop_pixels):(Left_tray+Width_tray_pixel-Righ_crop_pixels),
rep(1,20),
(Left_tray+Width_tray_pixel+25):dim(RBG_rotate)[1]),
c(top_ruler,
rep(1,20),
Top_cut_pixels:Bottom_cut_pixels,
rep(1,20),
bottom_ruler),]
display(Horizon_RBG_test)
Screenshot of Horizon with rulers
If it looks good, cut the tiffs and export.
Horizon_RBG <- RBG_rotate[(Left_tray+Left_crop_pixels):(Left_tray+Width_tray_pixel-Righ_crop_pixels),
Top_cut_pixels:Bottom_cut_pixels,]
display(Horizon_RBG)
Horizon_Macropores <- Macropores_rotate[(Left_tray+Left_crop_pixels):(Left_tray+Width_tray_pixel-Righ_crop_pixels),
Top_cut_pixels:Bottom_cut_pixels]
display(Horizon_Macropores)
Horizon_DEM <- DEM_rotate[(Left_tray+Left_crop_pixels):(Left_tray+Width_tray_pixel-Righ_crop_pixels),
Top_cut_pixels:Bottom_cut_pixels,]
display(Horizon_DEM)
## Save Horizon cropped tiffs ####
writeImage(Horizon_RBG,paste0("../Output Files/202202_updated code/Cropped_horizons/",horizon_name,'_RBG_CroppedHorizon.tiff'))
writeImage(Horizon_DEM,paste0("../Output Files/202202_updated code/Cropped_horizons/",horizon_name,'_DEM_CroppedHorizon.tiff'))
writeImage(Horizon_Macropores,paste0("../Output Files/202202_updated code/Cropped_horizons/",horizon_name,'_Macropores_CroppedHorizon.tiff'))
Metadata is similarly exported via the clipboard and placed into a different excel/googlesheet:
horizon_name <- paste0("mlt_", "KONZA_PPT_Btss1_0.18mmResolution", "_T3_76-92cm")
Horizon_metadata <- data.frame(
file=horizon_name,
hor="KONZA_PPT_Btss1_0.18mmResolution",
tray=param_files[3*n],
Width_tray_cm=Width_tray_mm/10,
Length_tray_cm=Length_tray_mm/10,
rotation_angle_degrees=angle,
bottom_crop_fromBottomTray_cm=(Top_tray+length_tray_pixel-Bottom_cut_pixels)*Resolution_mm/10,
top_crop_toptray_cm=(Top_cut_pixels-Top_tray)*Resolution_mm/10,
left_crop_fromLeftTray_cm=Left_crop_mm/10,
Right_crop_fromRightTray_cm=Right_crop_mm/10,
left_tray_col=Left_tray,
Top_tray_row=Top_tray)
write.table(Horizon_metadata , "clipboard-16384", sep="\t", row.names=FALSE)
The ImageJ macro provides the following statistics for each pore in the cropped macropore tiff:
Area, Mean, StdDev, Mode, Min, Max, X, Y, XM, YM, Perim., BX, BY, Width, Height, Major, Minor, Angle, Circ., Feret, IntDen, Median, Skew, Kurt, %Area, RawIntDen, Slice, FeretX, FeretY, FeretAngle, MinFeret, AR, Round, and Solidity.
More information on these variables is available on the ImageJ website: https://imagej.nih.gov/ij/docs/menus/analyze.html
The macro is a simple codification of the SOP in the MLT_ImageJ.docx document provided to me by Victoria Moreno on 03/02/2022.
The code goes as follows:
input = getDirectory("Choose Source Directory");
list = getFileList(input);
dir = getDirectory("Choose Results Directory");
for (i = 0; i < list.length; i++)
    Particle_analyze(input, list[i]);

function Particle_analyze(input, filename) {
    open(input + filename);
    // Assign raw opened image the name "currentImage"
    currentImage = getImageID();
    run("Set Scale...", "distance=5.5556 known=1 pixel=1 unit=mm global");
    run("Set Measurements...", "area mean standard modal min centroid center perimeter bounding fit shape feret's integrated median skewness kurtosis area_fraction stack limit redirect=None decimal=3");
    // since we output a binary file from the R script, white is 255 and black is 0;
    // the threshold below should recognize all the pores
    setThreshold(0, 250);
    run("Analyze Particles...", "display clear overlay");
    originalName = getTitle();
    originalNameWithoutExt = replace(originalName, ".tiff", "");
    resultName = originalNameWithoutExt + "_Analyze.csv";
    saveAs("Results", dir + resultName);
    run("Clear Results");
    run("Measure");
    // Change the file name here
    resultName = originalNameWithoutExt + "_Measure.csv";
    saveAs("Results", dir + resultName);
    run("Clear Results");
    run("*");
}
To run the macro:
Place all the cropped Macropore tiff files into an individual folder.
Open ImageJ, open the macro (Plugins > Macros > Run…)
Find the location of the 20220419_Analyze-and-Measure_Macro.ijm file and open it.
Select the folder where the cropped Macropore tiff files are at.
Select a folder where the Analyze and Measure csv files will be placed.
Close out from the ImageJ program.
Version control has been implemented for this code using the renv package. More information at: https://www.rstudio.com/blog/renv-project-environments-for-r/
Make sure you get the renv folder in the project folder and execute "renv::init()" to run with the exact same R packages.
sessionInfo()
## R version 4.1.2 (2021-11-01)
## Platform: x86_64-w64-mingw32/x64 (64-bit)
## Running under: Windows 10 x64 (build 19044)
##
## Matrix products: default
##
## locale:
## [1] LC_COLLATE=English_United States.1252
## [2] LC_CTYPE=English_United States.1252
## [3] LC_MONETARY=English_United States.1252
## [4] LC_NUMERIC=C
## [5] LC_TIME=English_United States.1252
##
## attached base packages:
## [1] compiler stats graphics grDevices datasets utils methods
## [8] base
##
## other attached packages:
## [1] matlab_1.0.2 EBImage_4.36.0 raster_3.5-15 sp_1.4-6 tiff_0.1-11
##
## loaded via a namespace (and not attached):
## [1] Rcpp_1.0.8.3 highr_0.9 bslib_0.3.1
## [4] BiocManager_1.30.16 jquerylib_0.1.4 bitops_1.0-7
## [7] tools_4.1.2 digest_0.6.29 jsonlite_1.8.0
## [10] evaluate_0.15 lattice_0.20-45 png_0.1-7
## [13] rlang_1.0.2 cli_3.2.0 rstudioapi_0.13
## [16] yaml_2.3.5 xfun_0.30 fastmap_1.1.0
## [19] terra_1.5-21 stringr_1.4.0 knitr_1.37
## [22] htmlwidgets_1.5.4 fftwtools_0.9-11 sass_0.4.0
## [25] locfit_1.5-9.5 grid_4.1.2 R6_2.5.1
## [28] jpeg_0.1-9 rmarkdown_2.13 magrittr_2.0.3
## [31] codetools_0.2-18 htmltools_0.5.2 BiocGenerics_0.40.0
## [34] abind_1.4-5 renv_0.15.4 stringi_1.7.6
## [37] RCurl_1.98-1.6